Decision Algorithms for Ostrowski-Automatic Sequences
We extend the notion of automatic sequences to a broader class, the Ostrowski-automatic sequences. We develop a procedure for computationally deciding certain combinatorial and enumeration questions about such sequences that can be expressed as predicates in first-order logic.
In Chapter 1, we begin with topics and ideas that are preliminary to this work, including a small introduction to non-standard positional numeration systems and the relationship between words and automata. In Chapter 2, we define the theoretical foundations for recognizing addition in a generalized Ostrowski numeration system and formalize the general theory underlying our decision procedure. Next, in Chapter 3, we show how to implement these ideas in practice, and provide the implementation as an integration with the automatic theorem-proving software package Walnut.
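For intuition on the numeration systems involved: when the continued fraction is that of the golden ratio, the Ostrowski numeration system specializes to the familiar Zeckendorf (Fibonacci) representation. The sketch below (illustrative only, not part of Walnut) converts an integer to its Zeckendorf digits with the standard greedy algorithm:

```python
def zeckendorf(n):
    """Zeckendorf digits of n (most significant first): the greedy
    representation of n as a sum of non-adjacent Fibonacci numbers."""
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    digits = []
    for f in reversed(fibs[:-1]):  # largest Fibonacci number <= n first
        if f <= n:
            digits.append(1)
            n -= f
        else:
            digits.append(0)
    while len(digits) > 1 and digits[0] == 0:  # strip leading zeros
        digits.pop(0)
    return digits
```

For example, `zeckendorf(10)` returns `[1, 0, 0, 1, 0]`, i.e. 10 = 8 + 2; the greedy choice guarantees that no two adjacent digits are both 1, which is the automaton-friendly property these decision procedures exploit.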
Further, we provide some applications of our work in Chapter 4. These applications span several topics in combinatorics on words, including repetitions, pattern-avoidance, critical exponents of special classes of words, properties of Lucas words, and so forth. Finally, we close with open problems on decidability and higher-order numeration systems and discuss future directions for research.
Effects of Graph Convolutions in Deep Networks
Graph Convolutional Networks (GCNs) are one of the most popular architectures
that are used to solve classification problems accompanied by graphical
information. We present a rigorous theoretical understanding of the effects of
graph convolutions in multi-layer networks. We study these effects through the
node classification problem of a non-linearly separable Gaussian mixture model
coupled with a stochastic block model. First, we show that a single graph
convolution expands the regime of the distance between the means where
multi-layer networks can classify the data by a factor of at least
$1/\sqrt[4]{\mathbb{E}[\mathrm{deg}]}$, where $\mathbb{E}[\mathrm{deg}]$ denotes the
expected degree of a node. Second, we show that with a slightly stronger graph
density, two graph convolutions improve this factor to at least
$1/\sqrt[4]{n}$, where $n$ is the number of nodes in the graph. Finally, we
provide both theoretical and empirical insights into the performance of graph
convolutions placed in different combinations among the layers of a network,
concluding that the performance is mutually similar for all combinations of the
placement. We present extensive experiments on both synthetic and real-world
data that illustrate our results.
Comment: 36 pages, 8 figures
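As a rough illustration of the phenomenon (a simplified one-dimensional sketch, not the paper's construction), a graph convolution can be taken to be a degree-normalized neighborhood average. On a two-class stochastic block model coupled with a Gaussian mixture, averaging over many neighbors shrinks the within-class noise, so a simple threshold classifies nodes whose raw features are far too noisy to separate; all parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-class SBM coupled with a 1-D Gaussian mixture.
# Parameters are illustrative, not taken from the paper.
n, p, q = 500, 0.5, 0.1           # nodes per class, intra/inter edge prob.
mu, sigma = 0.3, 1.0              # class means at -mu and +mu, noise std

labels = np.repeat([0, 1], n)
X = rng.normal(np.where(labels == 0, -mu, mu), sigma)

# Sample a symmetric SBM adjacency matrix, with self-loops.
same = labels[:, None] == labels[None, :]
probs = np.where(same, p, q)
A = (rng.random((2 * n, 2 * n)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T + np.eye(2 * n)

# One graph convolution: degree-normalized average over the neighborhood.
Xc = (A @ X) / A.sum(axis=1)

# Thresholding at 0: averaging over ~n(p+q) neighbors shrinks the
# within-class variance, so accuracy jumps after the convolution.
acc_raw = np.mean((X > 0) == labels)
acc_conv = np.mean((Xc > 0) == labels)
```

With these settings the raw features give accuracy near 62% while the convolved features classify nearly every node, matching the qualitative message that a convolution enlarges the regime of between-means distances where classification succeeds.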
Antisquares and Critical Exponents
The complement $\bar{x}$ of a binary word $x$ is obtained by changing each $0$
in $x$ to $1$ and vice versa. An antisquare is a nonempty word of the form
$x\bar{x}$. In this paper, we study infinite binary words that do not
contain arbitrarily large antisquares. For example, we show that the repetition
threshold for the language of infinite binary words containing exactly two
distinct antisquares is $(5+\sqrt{5})/2$. We also study repetition thresholds
for related classes, where "two" in the previous sentence is replaced by a
larger number.
We say a binary word is good if the only antisquares it contains are $01$ and
$10$. We characterize the minimal antisquares, that is, those words that are
antisquares but all proper factors are good. We determine the growth rate
of the number of good words of length $n$ and determine the repetition
threshold between polynomial and exponential growth for the number of good
words.